Web Survey Bibliography
Online surveys lack social interaction. In the relevant literature this is most often seen as an advantage (e.g. Tourangeau, Rips, & Rasinski, 2000). However, besides being a potential source of bias, human communication is also a source of motivation. Its absence could therefore contribute to the inherently high drop-out rates of online surveys.
The Social Interface Theory (e.g. Nass, Moon, & Green, 1997) states that even marginal human cues in an interface can induce behaviours that are commonly found in face-to-face contact. Within an experimental setting (n=2046), we attempted to simulate this social interaction in an online survey.
This was carried out by 1) addressing the participants directly with additional texts on several pages of the questionnaire. In these texts, participants were, for example, thanked for their participation or shown appreciation for answering long questions. As further experimental conditions, the texts were presented by either 2) a static or 3) an animated virtual person (avatar). One of ten avatars, which varied in gender and age, was chosen at random. In a 3x2 design, these three text conditions were additionally crossed with whether participants were addressed personally by name. A condition without additional texts served as the control condition.
No significant influences were found on drop-out behaviour (rate, time, or point of drop-out) or on answer behaviour (e.g. social desirability, indicators of data quality, or questionnaire ratings). Significant interaction effects between the gender of the avatar and that of the participant emerged for some questions concerning gender roles. Furthermore, male participants, younger participants, and those with a high level of social discomfort reacted more negatively to the human cues.
One possible reason why the results do not confirm the hypotheses is the participants' very high interest in the topic. This interest could also explain the very low drop-out rate, despite the length of the questionnaire (approximately 30 minutes). Participants may therefore have considered the additional motivation attempts superfluous.
General online research (GOR) 2008 (abstract)
Web survey bibliography (4086)
- Displaying Videos in Web Surveys: Implications for Complete Viewing and Survey Responses; 2017; Mendelson, J.; Lee Gibson, J.; Romano Bergstrom, J. C.
- Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not...; 2017; Toepoel, V.; Emerson, H.
- Mind the Mode: Differences in Paper vs. Web-Based Survey Modes Among Women With Cancer; 2017; Hagan, T. L.; Belcher, S. M.; Donovan, H. S.
- Answering Without Reading: IMCs and Strong Satisficing in Online Surveys; 2017; Anduiza, E.; Galais, C.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Social desirability bias in self-reported well-being measures: evidence from an online survey; 2017; Caputo, A.
- Web-Based Survey Methodology; 2017; Wright, K. B.
- Handbook of Research Methods in Health Social Sciences; 2017; Liamputtong, P.
- Lessons from recruitment to an internet based survey for Degenerative Cervical Myelopathy: merits of...; 2017; Davies, B.; Kotter, M. R.
- Web Survey Gamification - Increasing Data Quality in Web Surveys by Using Game Design Elements; 2017; Schacht, S.; Keusch, F.; Bergmann, N.; Morana, S.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Comparability of web and telephone surveys for the measurement of subjective well-being; 2017; Sarracino, F.; Riillo, C. F. A.; Mikucka, M.
- Achieving Strong Privacy in Online Survey; 2017; Zhou, Yo.; Zhou, Yi.; Chen, S.; Wu, S. S.
- A Meta-Analysis of the Effects of Incentives on Response Rate in Online Survey Studies; 2017; Mohammad Asire, A.
- Telephone versus Online Survey Modes for Election Studies: Comparing Canadian Public Opinion and Vote...; 2017; Breton, C.; Cutler, F.; Lachance, S.; Mierke-Zatwarnicki, A.
- Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate...; 2017; Saleh, A.; Bista, K.
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- Paradata as an aide to questionnaire design: Improving quality and reducing burden; 2017; Timm, E.; Stewart, J.; Sidney, I.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Interviewer Gender and Survey Responses: The Effects of Humanizing Cues Variations; 2017; Jablonski, W.; Krzewinska, A.; Grzeszkiewicz-Radulska, K.
- Millennials and emojis in Spain and Mexico.; 2017; Bosch Jover, O.; Revilla, M.
- Where, When, How and with What Do Panel Interviews Take Place and Is the Quality of Answers Affected...; 2017; Niebruegge, S.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Do distractions during web survey completion affect data quality? Findings from a laboratory experiment...; 2017; Wenz, A.
- Predicting Breakoffs in Web Surveys; 2017; Mittereder, F.; West, B. T.
- Measuring Subjective Health and Life Satisfaction with U.S. Hispanics; 2017; Lee, S.; Davis, R.
- Humanizing Cues in Internet Surveys: Investigating Respondent Cognitive Processes; 2017; Jablonski, W.; Grzeszkiewicz-Radulska, K.; Krzewinska, A.
- A Comparison of Emerging Pretesting Methods for Evaluating “Modern” Surveys; 2017; Geisen, E.; Murphy, J.
- The Effect of Respondent Commitment on Response Quality in Two Online Surveys; 2017; Cibelli Hibben, K.
- Pushing to web in the ISSP; 2017; Jonsdottir, G. A.; Dofradottir, A. G.; Einarsson, H. B.
- The 2016 Canadian Census: An Innovative Wave Collection Methodology to Maximize Self-Response and Internet...; 2017; Mathieu, P.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- In search of best practices; 2017; Kappelhof, J. W. S.; Steijn, S.
- Redirected Inbound Call Sampling (RICS): A New Methodology; 2017; Krotki, K.; Bobashev, G.; Levine, B.; Richards, S.
- An Empirical Process for Using Non-probability Survey for Inference; 2017; Tortora, R.; Iachan, R.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- A Comparison of Two Nonprobability Samples with Probability Samples; 2017; Zack, E. S.; Kennedy, J. M.
- Rates, Delays, and Completeness of General Practitioners’ Responses to a Postal Versus Web-Based...; 2017; Sebo, P.; Maisonneuve, H.; Cerutti, B.; Pascal Fournier, J.; Haller, D. M.
- Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary...; 2017; Meitinger, K.
- Nonresponse in Organizational Surveying: Attitudinal Distribution Form and Conditional Response Probabilities...; 2017; Kulas, J. T.; Robinson, D. H.; Kellar, D. Z.; Smith, J. A.
- Theory and Practice in Nonprobability Surveys: Parallels between Causal Inference and Survey Inference...; 2017; Mercer, A. W.; Kreuter, F.; Keeter, S.; Stuart, E. A.
- Is There a Future for Surveys; 2017; Miller, P. V.
- Reducing speeding in web surveys by providing immediate feedback; 2017; Conrad, F.; Tourangeau, R.; Couper, M. P.; Zhang, C.
- Social Desirability and Undesirability Effects on Survey Response latencies; 2017; Andersen, H.; Mayerl, J.
- A Working Example of How to Use Artificial Intelligence To Automate and Transform Surveys Into Customer...; 2017; Neve, S.
- A Case Study on Evaluating the Relevance of Some Rules for Writing Requirements through an Online Survey...; 2017; Warnier, M.; Condamines, A.
- Estimating the Impact of Measurement Differences Introduced by Efforts to Reach a Balanced Response...; 2017; Kappelhof, J. W. S.; De Leeuw, E. D.
- Targeted letters: Effects on sample composition and item non-response; 2017; Bianchi, A.; Biffignandi, S.